Definitions of West Germany
  1. noun
    a republic in north central Europe on the North Sea; established in 1949 from the zones of Germany occupied by the British, French, and Americans after the German defeat; reunified with East Germany in 1990